- Title
- A component based approach for classifying the seven universal facial expressions of emotion
- Creator
- Hong, Kenny; Chalup, Stephan K.; King, Robert A. R.
- Relation
Proceedings of the 2013 IEEE Symposium on Computational Intelligence for Creativity and Affective Computing (CICAC) (Singapore, 16-19 April 2013), p. 1-8
- Publisher Link
- http://dx.doi.org/10.1109/CICAC.2013.6595214
- Publisher
- Institute of Electrical and Electronics Engineers (IEEE)
- Resource Type
- conference paper
- Date
- 2013
- Description
In a previous publication we showed that a component approach performs better than a holistic approach for classifying 6 discrete facial expressions of emotion (happy, sad, surprised, fearful, disgusted and angry) plus the neutral face. In this paper, we present the impact on the two approaches when a seventh facial expression is included. This seventh expression is the contemptuous expression, an expression that has rarely been explored in the machine learning literature. Together, the 7 facial expressions are known in the psychology literature as Ekman's Universal Facial Expressions of Emotion. To replicate our previous experiments, we use the same set of image preprocessing techniques (grayscale, Local Binary Patterns, Sobel and Canny edge detection, and manually inserted feature points, which we refer to as components), organised into four approaches: holistic, holistic action, component and component action (where an action approach is produced by taking the difference between a neutral and an expressive face). The facial expressions are classified using a standard multiclass Support Vector Machine (SVM) and a pairwise adaptive multiclass SVM (pa-SVM), which uses pairwise adaptive model parameters. The significance of this study is to provide a thorough understanding of the choices that affect a classifier's performance, and of the performance dynamics between expression pairs once the contemptuous expression is considered. In particular, we extend these issues to the contemptuous expression and justify its use by noting its universal presence in human expression and its asymmetric nature, which is not shared with the other expressions. Using our face model, which places feature points around the eyes, brows and mouth, we obtained a best correct classification rate of 98.57% with the contemptuous expression included.
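The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the kind of pipeline the abstract describes: per-image preprocessing (grayscale, Local Binary Patterns, Sobel, Canny), an "action" feature formed as the difference between a neutral and an expressive face, and a standard one-vs-one multiclass SVM. It uses OpenCV, scikit-image and scikit-learn calls; the LBP parameters, the `load_dataset` placeholder and the plain `SVC` standing in for the pa-SVM (whose pairwise adaptive parameters are not available off the shelf) are all assumptions, not details from the paper.

```python
# Minimal sketch, assuming OpenCV / scikit-image / scikit-learn; not the
# authors' pa-SVM pipeline.
import numpy as np
import cv2
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC


def preprocess(img_bgr, method="lbp"):
    """Return a flattened feature vector for one face image."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    if method == "gray":
        feat = gray.astype(np.float32)
    elif method == "lbp":
        # 8 neighbours, radius 1, uniform patterns: common LBP settings,
        # assumed here; the paper's exact parameters may differ.
        feat = local_binary_pattern(gray, P=8, R=1, method="uniform")
    elif method == "sobel":
        # Combine horizontal and vertical gradient responses.
        feat = cv2.Sobel(gray, cv2.CV_32F, 1, 0) + cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    elif method == "canny":
        feat = cv2.Canny(gray, 100, 200)
    else:
        raise ValueError(f"unknown preprocessing method: {method}")
    return np.asarray(feat, dtype=np.float32).ravel()


def action_feature(neutral_bgr, expressive_bgr, method="lbp"):
    """'Action' representation: expressive minus neutral preprocessing output."""
    return preprocess(expressive_bgr, method) - preprocess(neutral_bgr, method)


# Hypothetical usage: X rows are action features, y holds the seven labels
# (happy, sad, surprised, fearful, disgusted, angry, contemptuous).
# X, y = load_dataset(...)                                  # placeholder, not a real API
# clf = SVC(kernel="rbf", decision_function_shape="ovo")    # one-vs-one multiclass SVM
# clf.fit(X, y)
# pred = clf.predict(action_feature(neutral_img, expressive_img).reshape(1, -1))
```

A one-vs-one `SVC` is used here because the pa-SVM described in the abstract also decomposes the seven-class problem into expression pairs; the difference is that the pa-SVM tunes model parameters per pair, which this sketch does not attempt.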
- Subject
- Canny edge detection; Ekman universal facial emotion expression classification; grayscale; holistic action approach; image preprocessing techniques; local binary patterns; machine learning literature; neutral face; pa-SVM; pairwise adaptive model parameters; pairwise adaptive multiclass SVM; standard multiclass support vector machine; Sobel edge detection; classifier performance; component action approach; component approach; component based approach; contemptuous expression; expression pair performance dynamics; expressive face
- Identifier
- http://hdl.handle.net/1959.13/1042955
- Identifier
- uon:14150
- Rights
- This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of University of Newcastle's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
- Language
- eng
- Full Text
- Reviewed
| File | Description | Size | Format |
|---|---|---|---|
| ATTACHMENT02 | Author final version | 759 KB | Adobe Acrobat PDF |